Montague Meets Markov: Deep Semantics with Probabilistic Logical Form
Authors
Abstract
We combine logical and distributional representations of natural language meaning by transforming distributional similarity judgments into weighted inference rules using Markov Logic Networks (MLNs). We show that this framework supports both judging sentence similarity and recognizing textual entailment by appropriately adapting the MLN implementation of logical connectives. We also show that distributional phrase similarity, used as textual inference rules created on the fly, improves performance on both tasks.
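The core idea above, turning a distributional similarity score into a weighted MLN inference rule, can be sketched in a few lines of Python. This is a minimal illustration, not the paper's implementation: the toy vectors, predicate names, and the linear similarity-to-weight mapping are all assumptions made here for clarity.

```python
import math

def cosine(u, v):
    """Cosine similarity between two dense vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def similarity_to_mln_rule(pred_a, pred_b, vec_a, vec_b, prior=1.0):
    """Emit a weighted first-order rule "w  a(x) => b(x)" whose
    weight is derived from distributional similarity.
    The linear scaling by `prior` is an illustrative choice."""
    sim = cosine(vec_a, vec_b)
    weight = prior * sim
    return f"{weight:.3f}  {pred_a}(x) => {pred_b}(x)"

# Toy distributional embeddings (hypothetical values).
v_grab = [0.9, 0.1, 0.3]
v_seize = [0.85, 0.2, 0.25]
print(similarity_to_mln_rule("grab", "seize", v_grab, v_seize))
```

In an MLN the weight controls how strongly worlds violating the rule are penalized, so near-synonymous phrase pairs yield near-hard entailment rules while weakly related pairs contribute only soft evidence.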
Similar Papers
Towards a Type-Theoretical Account of Lexical Semantics
After a quick overview of the field of study known as "Lexical Semantics", where we advocate the need to access additional information besides syntax and Montague-style semantics at the lexical level in order to complete the full analysis of an utterance, we summarize the current formulations of a well-known theory of that field. We then propose and justify our own model of the Generative Lex...
Type-Logical Semantics (Reinhard Muskens)
Type-logical semantics studies linguistic meaning with the help of the theory of types. The latter originated with Russell as an answer to the paradoxes, but has the additional virtue that it is very close to ordinary language. In fact, type theory is so much more similar to language than predicate logic is, that adopting it as a vehicle of representation can overcome the mismatches between gra...
On the Relationship between Logical Bayesian Networks and Probabilistic Logic Programming Based on the Distribution Semantics
A significant part of current research on ILP deals with probabilistic logical models. Over the last decade many logics or languages for representing such models have been introduced. There is currently a great need for insight into the relationships between all these languages. One class comprises languages that extend probabilistic models with elements of logic, such as in the language of ...
Integrating Logical Representations with Probabilistic Information using Markov Logic
First-order logic provides a powerful and flexible mechanism for representing natural language semantics. However, it is an open question how best to integrate it with uncertain, probabilistic knowledge, for example regarding word meaning. This paper describes the first steps of an approach to recasting first-order semantics into the probabilistic models that are part of Statistical Relation...
UTexas: Natural Language Semantics using Distributional Semantics and Probabilistic Logic
We represent natural language semantics by combining logical and distributional information in probabilistic logic. We use Markov Logic Networks (MLN) for the RTE task, and Probabilistic Soft Logic (PSL) for the STS task. The system is evaluated on the SICK dataset. Our best system achieves 73% accuracy on the RTE task, and a Pearson’s correlation of 0.71 on the STS task.